Evaluating Rater Quality and Rating Difficulty in Online Annotation Activities

Authors

  • Peter Organisciak
  • Miles Efron
  • Katrina Fenlon
  • Megan Senseney
Abstract

Gathering annotations from non-expert online raters is an attractive method for quickly completing large-scale annotation tasks, but the increased possibility of unreliable annotators and diminished work quality remain causes for concern. In information retrieval, where human-encoded relevance judgments underlie the evaluation of new systems and methods, the ability to collect trustworthy annotations quickly and reliably allows for faster development and iteration of research. Focusing on paid online workers, this study evaluates indicators of non-expert performance along three lines: temporality, experience, and agreement. We find that a rater's past performance is a key indicator of future performance. The time raters spend familiarizing themselves with a new set of tasks also matters for rater quality, as does long-term familiarity with the topic being rated. These findings may inform how large-scale digital collections use non-expert raters to perform more purposive and affordable online annotation activities.
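Of the three indicators the abstract names, agreement is the most directly computable. As a minimal illustration (not the paper's actual method), the Python sketch below scores each rater by how often they agree with the per-task majority label; the function names and data are invented for the example.

```python
# Hypothetical sketch: scoring raters by agreement with per-task majority labels.
# Names and data are illustrative assumptions, not taken from the paper.
from collections import Counter, defaultdict

def majority_label(labels):
    """Return the most common label among a task's ratings."""
    return Counter(labels).most_common(1)[0][0]

def rater_agreement(ratings):
    """ratings: list of (rater_id, task_id, label) tuples.
    Returns each rater's rate of agreement with the per-task majority."""
    by_task = defaultdict(list)
    for rater, task, label in ratings:
        by_task[task].append(label)
    majorities = {task: majority_label(labels) for task, labels in by_task.items()}

    hits, totals = Counter(), Counter()
    for rater, task, label in ratings:
        totals[rater] += 1
        hits[rater] += int(label == majorities[task])
    return {rater: hits[rater] / totals[rater] for rater in totals}

ratings = [
    ("r1", "t1", "relevant"), ("r2", "t1", "relevant"), ("r3", "t1", "not"),
    ("r1", "t2", "not"),      ("r2", "t2", "not"),      ("r3", "t2", "not"),
]
print(rater_agreement(ratings))  # {'r1': 1.0, 'r2': 1.0, 'r3': 0.5}
```

Note that with few raters per task, a rater's own vote sways the majority; evaluations like the one described here typically also compare against held-out expert (gold) judgments.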


Similar Articles

Inter-sentential Coreferences in Semantic Networks: An Evaluation of Manual Annotation

We present an evaluation of inter-sentential coreference annotation in the context of manually created semantic networks. The semantic networks are constructed independently by each annotator and require an entity mapping prior to evaluating the coreference. We present the model used for mapping the semantic entities as well as the algorithm used for our evaluation task. We present the raw sta...


Finding your "Inner-Annotator": An Experiment in Annotator Independence for Rating Discourse Coherence Quality in Essays

An experimental annotation method is described that shows promise for a subjective labeling task: rating the discourse coherence quality of essays. Annotators developed personal protocols, reducing the front-end resources spent on protocol development and annotator training. Substantial inter-annotator agreement was achieved on a 4-point scale. Correlational analyses revealed how unique linguistic phenomena were cons...


Iranian EFL Learners L2 Reading Comprehension: The Effect of Online Annotations via Interactive White Boards

This study explores the effect of online annotations via Interactive White Boards (IWBs) on the reading comprehension of Iranian EFL learners. To this aim, 60 students from a language institute were selected as homogeneous based on their performance on the Oxford Placement Test (2014). They were then randomly assigned to three experimental groups of 20 and subsequently exposed to the research treatment af...


The virtual focus group: a modern methodology for facial attractiveness rating.

BACKGROUND: Traditional focus groups have been essential to facial aesthetics research. Although they are currently the criterion standard for acquiring facial attractiveness ratings, they retain many shortcomings. This study's objectives were twofold: to determine whether attractiveness scores obtained from a social network site correlate with those from the traditional focus group method; and t...


Best-Worst Scaling More Reliable than Rating Scales: A Case Study on Sentiment Intensity Annotation

Rating scales are a widely used method for data annotation; however, they present several challenges, such as difficulty in maintaining inter- and intra-annotator consistency. Best-worst scaling (BWS) is an alternative annotation method that is claimed to produce high-quality annotations while keeping the required number of annotations similar to that of rating scales; a minimal scoring sketch follows this list. However, the veracity of...

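Best-worst scaling is commonly scored with a simple counting procedure: an item's score is the fraction of times it was chosen best minus the fraction of times it was chosen worst. The sketch below illustrates that procedure under this assumption; the tuples, items, and judgments are invented for the example and are not data from the case study above.

```python
# Hypothetical sketch of BWS's simple counting procedure:
# score(item) = (#times chosen best - #times chosen worst) / #appearances.
# The 4-tuples and choices below are illustrative, not data from the paper.
from collections import Counter

def bws_scores(judgments):
    """judgments: list of (tuple_of_items, best_item, worst_item)."""
    best, worst, seen = Counter(), Counter(), Counter()
    for items, b, w in judgments:
        best[b] += 1
        worst[w] += 1
        for item in items:
            seen[item] += 1
    # Scores fall in [-1, 1]; higher means more intense / more preferred.
    return {item: (best[item] - worst[item]) / seen[item] for item in seen}

judgments = [
    (("joyful", "fine", "sad", "awful"), "joyful", "awful"),
    (("fine", "sad", "awful", "joyful"), "joyful", "awful"),
    (("sad", "fine", "joyful", "awful"), "fine", "sad"),
]
print(bws_scores(judgments))
# {'joyful': 0.67, 'fine': 0.33, 'sad': -0.33, 'awful': -0.67} (approx.)
```

Because each judgment forces a best and a worst choice, two annotators who agree on the extremes yield consistent scores even if their internal scales differ, which is the consistency advantage claimed for BWS over rating scales.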




Publication year: 2012